Iteratively-Reweighted Least-Squares Fitting of Support Vector Machines: A Majorization-Minimization Algorithm Approach
Authors
Abstract
Support vector machines (SVMs) are an important tool in modern data analysis. Traditionally, support vector machines have been fitted via quadratic programming, either using purpose-built or off-the-shelf algorithms. We present an alternative approach to SVM fitting via the majorization–minimization (MM) paradigm. Algorithms that are derived via MM algorithm constructions can be shown to monotonically decrease their objectives at each iteration, as well as be globally convergent to stationary points. We demonstrate the construction of iteratively-reweighted least-squares (IRLS) algorithms, via the MM paradigm, for SVM risk minimization problems involving the hinge, least-square, squared-hinge, and logistic losses, and 1-norm, 2-norm, and elastic net penalizations. Successful implementations of our algorithms are presented via some numerical examples.

∗HDN is at the Department of Mathematics and Statistics, La Trobe University, Bundoora Victoria, Australia 3086 (email: [email protected]). GJM is at the School of Mathematics and Statistics, University of Queensland, St. Lucia Queensland, Australia 4072.

arXiv:1705.04651v1 [stat.CO] 12 May 2017
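As a concrete illustration of the IRLS-via-MM idea described in the abstract, the following sketch fits a 2-norm-penalised, hinge-loss linear SVM. It is not the authors' exact algorithm: the majorizer used here is the standard quadratic bound |z| ≤ z²/(2|z₀|) + |z₀|/2 applied to hinge(u) = (z + |z|)/2 with z = 1 − u, which turns each MM step into a weighted ridge regression; the function name and the ε safeguard are illustrative assumptions.

```python
import numpy as np

def irls_hinge_svm(X, y, lam=0.1, n_iter=50, eps=1e-8):
    """MM/IRLS sketch for a 2-norm-penalised hinge-loss linear SVM.

    Each MM step majorises hinge(u) = max(0, 1 - u) = (z + |z|)/2, z = 1 - u,
    via |z| <= z^2/(2|z0|) + |z0|/2, so the step reduces to a weighted
    ridge regression (one linear solve). Labels y must be coded +/-1.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        u = y * (X @ beta)              # current functional margins
        c = np.abs(1.0 - u) + eps       # |z0|, safeguarded away from zero
        a = 1.0 / (4.0 * c)             # IRLS weights
        z = y * (1.0 + c)               # working responses
        A = X.T * a                     # X^T diag(a)
        beta = np.linalg.solve(A @ X + 0.5 * n * lam * np.eye(p), A @ z)
    return beta
```

By the MM argument in the abstract, each such step cannot increase the penalised hinge risk (up to the ε safeguard), which is the monotonicity property that motivates the construction.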
Similar resources
Convergence Analysis of Generalized Iteratively Reweighted Least Squares Algorithms on Convex Function Spaces
The computation of robust regression estimates often relies on minimization of a convex functional on a convex set. In this paper we discuss a general technique, applicable to a large class of convex functionals, for computing the minimizers iteratively; the technique is closely related to majorization-minimization algorithms. Our approach is based on a quadratic approximation of the functional to be minimized and inc...
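A minimal sketch of the quadratic-approximation idea for one classical robust-regression case, Huber's loss (the function name, the default δ, and the ε safeguard are assumptions, not taken from the cited paper): the loss is majorised at the current residual r₀ by a parabola with curvature min(1, δ/|r₀|), so each iteration is a weighted least-squares solve.

```python
import numpy as np

def irls_huber(X, y, delta=1.345, n_iter=50, eps=1e-8):
    """IRLS sketch for Huber-loss regression via quadratic majorization.

    At residual r0, Huber's rho is bounded above by a parabola with
    curvature w(r0) = min(1, delta/|r0|); minimising the resulting
    surrogate is a weighted least-squares problem.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS start
    for _ in range(n_iter):
        r = y - X @ beta                          # current residuals
        w = np.minimum(1.0, delta / (np.abs(r) + eps))  # IRLS weights
        A = X.T * w                               # X^T diag(w)
        beta = np.linalg.solve(A @ X, A @ y)
    return beta
```

Because each weighted solve minimises a majorizer of the Huber objective, the iterates decrease that objective monotonically, which is exactly the link to MM algorithms the abstract points out.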
Adaptive total variation image deblurring: A majorization-minimization approach
This paper presents a new approach to total variation (TV) based image deconvolution/deblurring, which is adaptive in the sense that it doesn’t require the user to specify the value of the regularization parameter. We follow the Bayesian approach of integrating out this parameter, which is achieved by using an approximation of the partition function of the probabilistic prior interpretation of ...
Fast Logistic Regression for Data Mining, Text Classification and Link Detection
Previous work by the authors [1] demonstrated that logistic regression can be a fast and accurate data mining tool for life sciences datasets, competitive with modern tools like support vector machines and balltree based K-NN. This paper has two objectives. The first objective is a serious empirical comparison of logistic regression to several classical and modern learners on a variety of learn...
Model Selection for Kernel Probit Regression
The convex optimisation problem involved in fitting a kernel probit regression (KPR) model can be solved efficiently via an iteratively re-weighted least-squares (IRWLS) approach. The use of successive quadratic approximations of the true objective function suggests an efficient approximate form of leave-one-out cross-validation for KPR, based on an existing exact algorithm for the weighted lea...
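To make the IRWLS connection concrete, here is a sketch of Fisher scoring for plain probit regression, which is the identity-kernel special case of the KPR fitting scheme the abstract describes; the function name, the small `ridge` stabiliser, and the clipping/ε safeguards are assumptions for numerical robustness, not part of the cited method.

```python
import math
import numpy as np

def probit_irls(X, y, ridge=1e-6, n_iter=25, eps=1e-10):
    """Fisher-scoring / IRWLS sketch for probit regression (y coded 0/1).

    Each step solves a weighted least-squares problem with
    Fisher-information weights w = phi(eta)^2 / (mu (1 - mu)) and
    working responses z = eta + (y - mu) / phi(eta).
    """
    Phi = lambda t: 0.5 * (1.0 + np.vectorize(math.erf)(t / np.sqrt(2.0)))
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = np.clip(X @ beta, -8.0, 8.0)       # avoid degenerate probabilities
        mu = Phi(eta)                            # fitted P(y = 1)
        phi = np.exp(-0.5 * eta**2) / np.sqrt(2.0 * np.pi)
        w = phi**2 / (mu * (1.0 - mu) + eps)     # Fisher-information weights
        z = eta + (y - mu) / (phi + eps)         # working responses
        A = X.T * w                              # X^T diag(w)
        beta = np.linalg.solve(A @ X + ridge * np.eye(p), A @ z)
    return beta
```

Each quadratic subproblem here plays the same role as the successive quadratic approximations mentioned in the abstract, which is what makes an efficient approximate leave-one-out scheme possible.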
Majorization-Minimization Algorithms for Wavelet-Based Image Restoration
Standard formulations of image/signal deconvolution under wavelet-based priors/regularizers lead to very high-dimensional optimization problems involving the following difficulties: the non-Gaussian (heavy-tailed) wavelet priors lead to objective functions which are nonquadratic, usually nondifferentiable, and sometimes even nonconvex; the presence of the convolution operator destroys the separ...
Journal: CoRR
Volume: abs/1705.04651
Pages: -
Publication date: 2017